# 07. Not sure where to start?

If you're not sure where to start, here are some suggestions for making progress on the project. These are only suggestions, so feel free to follow whatever path works best for you!
## Step 1: Master the details of the Deep Deterministic Policy Gradients (DDPG) algorithm.
Read the DDPG paper to master all of its details. Focus on Sections 3 (Algorithm) and 7 (Experiment Details) to learn how to adapt the implementation to your task. Refer to the lesson on Actor-Critic Methods to cement your understanding. If you have any questions, post them in Slack!
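As an example of the kind of detail worth mastering, the paper's experiment section uses an Ornstein-Uhlenbeck process for exploration noise. Here is a minimal NumPy sketch; the class name and interface are illustrative (not the lesson's exact code), while theta=0.15 and sigma=0.2 are the values reported in the paper:

```python
import numpy as np

class OUNoise:
    """Ornstein-Uhlenbeck exploration noise, as described in the DDPG paper
    (theta=0.15 and sigma=0.2 are the values the paper reports)."""

    def __init__(self, size, mu=0.0, theta=0.15, sigma=0.2, seed=0):
        self.mu = mu * np.ones(size)
        self.theta = theta
        self.sigma = sigma
        self.rng = np.random.default_rng(seed)
        self.reset()

    def reset(self):
        """Restart the process at its long-run mean."""
        self.state = self.mu.copy()

    def sample(self):
        """One Euler step of dx = theta * (mu - x) + sigma * dW."""
        dx = (self.theta * (self.mu - self.state)
              + self.sigma * self.rng.standard_normal(len(self.state)))
        self.state = self.state + dx
        return self.state
```

Because the process is temporally correlated, it tends to produce smoother exploration than independent Gaussian noise, which suits physical control tasks.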
## Step 2: Study the coding exercise from the lesson.
In the Actor-Critic Methods lesson, you applied a DDPG implementation to an OpenAI Gym task. Take the time to understand this code in great detail. Tweak the various hyperparameters and settings to build your intuition for what should work well (and what doesn't!).
## Step 3: Adapt the code from the lesson to the project.
Adapt the code from the exercise to the project, while making as few modifications as possible. Don't worry about efficiency, and just make sure the code runs. Don't worry about modifying hyperparameters, optimizers, or anything else of that nature just yet.
For this step, you do not need to run your code on a GPU. In particular, if working in the Udacity-provided Workspace, GPU should not be enabled. Save your GPU hours for the next step!
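As a sketch of what "just make it run" can look like, here is an environment-agnostic episode loop. It assumes a Gym-style environment (`reset()` returns a state, `step(action)` returns state, reward, done, info) and an agent exposing `act()` and `step()` as in the lesson's code; the names are assumptions, and your project environment may use different signatures, in which case only this thin wrapper should need adapting:

```python
def run_episode(env, agent, max_t=1000):
    """Run one episode: act, store the transition, learn, repeat."""
    state = env.reset()
    score = 0.0
    for _ in range(max_t):
        action = agent.act(state)
        next_state, reward, done, _ = env.step(action)
        agent.step(state, action, reward, next_state, done)  # store + learn
        state = next_state
        score += reward
        if done:
            break
    return score
```

Keeping the agent untouched and confining environment-specific code to a wrapper like this makes the "as few modifications as possible" goal much easier to hit.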
## Step 4: Optimize the hyperparameters.
After you have verified that your DDPG code runs, try a few short training sessions on CPU to sanity-check that your agent is learning. If it fails to learn, try out a few potential solutions by modifying your code. Once you're feeling confident (or impatient :)), enable GPU support and launch a longer training run!
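If you want to be systematic about the hyperparameter search, even a tiny random search is often enough. The sketch below is illustrative: `train` is a placeholder for whatever function runs a training session with a given config and returns the agent's average score, and the search ranges are assumptions, not recommendations:

```python
import random

def random_search(train, n_trials=5, seed=0):
    """Sample a few configs, train with each, and keep the best score."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(n_trials):
        cfg = {
            "lr_actor": 10 ** rng.uniform(-5, -3),   # log-uniform learning rates
            "lr_critic": 10 ** rng.uniform(-4, -2),
            "tau": 10 ** rng.uniform(-4, -2),        # soft-update rate
            "batch_size": rng.choice([64, 128, 256]),
        }
        score = train(cfg)
        if score > best_score:
            best_score, best_cfg = score, cfg
    return best_score, best_cfg
```

Sampling learning rates log-uniformly (rather than uniformly) is a common choice, since their useful values span several orders of magnitude.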
## Step 5: Continue to explore!
Read this paper, which evaluates the performance of various deep RL algorithms on continuous control tasks. The paper benchmarks REINFORCE, TNPG, RWR, REPS, TRPO, CEM, CMA-ES, and DDPG, and provides some useful guidance for figuring out which algorithms are best suited to the project.